Fast low rank approximations of matrices and tensors
Abstract
In many applications, such as data compression, imaging, or genomic data analysis, it is important to approximate a given m × n matrix A by a matrix B of rank at most k, where k is much smaller than m and n. The best rank-k approximation can be determined via the singular value decomposition, which, however, has prohibitively high computational complexity and storage requirements for very large m and n. We present an optimal least squares algorithm for computing a rank-k approximation to an m × n matrix A by reading only a limited number of rows and columns of A. The algorithm has complexity O(k^2 max(m, n)) and allows one to iteratively improve given rank-k approximations by reading additional rows and columns of A. We also show how this approach can be extended to tensors and present numerical results.
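The abstract describes building a rank-k approximation from only a limited number of rows and columns of A. As a hedged illustration of that general idea, the sketch below uses a generic skeleton (CUR-type) approximation with random row and column sampling; it is not the authors' least squares algorithm, whose selection and update rules are not given in this abstract.

```python
import numpy as np

def skeleton_approx(A, k, seed=None):
    """Rank-(at most k) approximation of A built from k sampled
    columns and k sampled rows (a generic CUR/skeleton scheme).

    Illustrative sketch only: random sampling stands in for the
    paper's (unspecified) row/column selection strategy.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    cols = rng.choice(n, size=k, replace=False)
    rows = rng.choice(m, size=k, replace=False)
    C = A[:, cols]                        # m x k: sampled columns
    R = A[rows, :]                        # k x n: sampled rows
    W = A[np.ix_(rows, cols)]             # k x k core at the intersections
    return C @ np.linalg.pinv(W) @ R      # rank <= k approximation

# Usage: an exactly rank-6 matrix is recovered (almost surely) from
# 6 rows and 6 columns, since the 6 x 6 core then has full rank.
gen = np.random.default_rng(0)
A = gen.standard_normal((200, 6)) @ gen.standard_normal((6, 150))
B = skeleton_approx(A, 6, seed=1)
print(np.linalg.norm(A - B) / np.linalg.norm(A))  # tiny residual
```

Note that the factored form (C, pinv(W), R) can be stored and applied without ever forming the full m × n product, which is what makes row/column-based schemes attractive for very large matrices.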
Similar resources
Orthogonal Rank-two Tensor Approximation: a Modified High-order Power Method and Its Convergence Analysis
With the notable exceptions that tensors of order 2 (that is, matrices) always have best approximations of arbitrary low rank and that tensors of any order always have a best rank-one approximation, it is known that high-order tensors can fail to have best low-rank approximations. When the condition of orthogonality is imposed, even in the most general case that only one pair of components in...
Orthogonal Low Rank Tensor Approximation: Alternating Least Squares Method and Its Global Convergence
With the notable exceptions of two cases, namely that tensors of order 2 (that is, matrices) always have best approximations of arbitrary low rank and that tensors of any order always have a best rank-one approximation, it is known that high-order tensors may fail to have best low-rank approximations. When the condition of orthogonality is imposed, even under the modest assumption that only one set...
Discretized Dynamical Low-Rank Approximation in the Presence of Small Singular Values
Low-rank approximations to large time-dependent matrices and tensors are the subject of this paper. These matrices and tensors are either given explicitly or are the unknown solutions of matrix and tensor differential equations. Based on splitting the orthogonal projection onto the tangent space of the low-rank manifold, novel time integrators for obtaining approximations by low-rank matrices a...
Tensor Rank and the Ill-Posedness of the Best Low-Rank Approximation Problem
There has been continued interest in seeking a theorem describing optimal low-rank approximations to tensors of order 3 or higher that parallels the Eckart–Young theorem for matrices. In this paper, we argue that the naive approach to this problem is doomed to failure because, unlike matrices, tensors of order 3 or higher can fail to have best rank-r approximations. The phenomenon is much more...
Best subspace tensor approximations
In many applications, such as data compression, imaging, or genomic data analysis, it is important to approximate a given tensor by a tensor that is sparsely representable. For matrices, i.e. 2-tensors, such a representation can be obtained via the singular value decomposition, which allows one to compute the best rank-k approximations. For t-tensors with t > 2, many generalizations of the singular val...
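The matrix case referred to here is the classical Eckart–Young theorem: truncating the SVD after k terms gives the best rank-k approximation in both the Frobenius and spectral norms. A minimal numpy sketch of that standard construction:

```python
import numpy as np

def best_rank_k(A, k):
    """Best rank-k approximation of A (Frobenius and spectral norms)
    via the truncated SVD, per the Eckart-Young theorem."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k, :]   # keep top-k singular triplets

A = np.arange(12.0).reshape(3, 4)   # a rank-2 matrix
B = best_rank_k(A, 1)
# The Frobenius error equals the root of the sum of the discarded
# squared singular values; no other rank-1 matrix can do better.
s = np.linalg.svd(A, compute_uv=False)
print(np.linalg.norm(A - B), np.sqrt((s[1:] ** 2).sum()))
```

For t-tensors with t > 2, no single generalization of the SVD retains all of these optimality properties, which is precisely the difficulty the abstracts above discuss.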